Combining search directions using gradient flows
Authors
Abstract
The efficient combination of directions is a significant problem in line search methods that either use negative curvature or wish to include additional information such as the gradient or different approximations to the Newton direction. In this paper we describe a new procedure to combine several of these directions within an interior-point primal-dual algorithm. Basically, we combine in an efficient manner a modified Newton direction with the gradient of a merit function and a direction of negative curvature, if it exists. We also show that the procedure is well-defined, and it has reasonable theoretical properties regarding the convergence of the method. We also present numerical results from an implementation of the proposed algorithm on a set of small test problems from the CUTE collection.
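As a rough illustration only (the paper's actual procedure is based on gradient flows and is not reproduced here), the sketch below shows one simple way a modified Newton direction, the merit-function gradient, and a negative-curvature direction could be merged into a single descent direction before the line search; the function name, weights, and safeguard are assumptions made for this example.

```python
import numpy as np

def combined_direction(grad_merit, newton_dir, neg_curv_dir=None,
                       w_newton=1.0, w_grad=1e-2, w_curv=1.0):
    """Illustrative combination of search directions (hypothetical weights;
    not the gradient-flow scheme of the paper). Inputs are numpy arrays."""
    d = w_newton * newton_dir - w_grad * grad_merit
    if neg_curv_dir is not None:
        # Orient the negative-curvature direction so it does not increase
        # the merit function to first order.
        if np.dot(grad_merit, neg_curv_dir) > 0:
            neg_curv_dir = -neg_curv_dir
        d = d + w_curv * neg_curv_dir
    # Safeguard: fall back to steepest descent on the merit function
    # if the combined direction fails to be a descent direction.
    if np.dot(grad_merit, d) >= 0:
        d = -grad_merit
    return d
```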
Similar resources
Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property
Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
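For reference, the classical Hestenes-Stiefel and Polak-Ribiere-Polyak parameters and the sufficient descent condition mentioned above can be sketched as follows; this is the standard two-term iteration, not the modified three-term versions proposed in that paper, and the constant c is an illustrative choice.

```python
import numpy as np

def beta_hs(g_new, g_old, d_old):
    # Classical Hestenes-Stiefel parameter: g_{k+1}^T y_k / (d_k^T y_k).
    y = g_new - g_old
    return np.dot(g_new, y) / np.dot(d_old, y)

def beta_prp(g_new, g_old, d_old):
    # Classical Polak-Ribiere-Polyak parameter: g_{k+1}^T y_k / ||g_k||^2.
    y = g_new - g_old
    return np.dot(g_new, y) / np.dot(g_old, g_old)

def sufficient_descent(g, d, c=1e-4):
    # Sufficient descent condition: g^T d <= -c * ||g||^2 for some c > 0.
    return np.dot(g, d) <= -c * np.dot(g, g)

def cg_direction(g_new, g_old, d_old, beta_fn=beta_prp):
    # Standard two-term conjugate gradient direction: -g_{k+1} + beta * d_k.
    return -g_new + beta_fn(g_new, g_old, d_old) * d_old
```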
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a de...
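As a reminder of the Dai-Liao idea referenced above, a minimal sketch of the secant-condition-based parameter is given below; the value of the nonnegative parameter t is an arbitrary choice here, and the sketch does not include the descent-generating modifications that the paper develops.

```python
import numpy as np

def dai_liao_direction(g_new, g_old, x_new, x_old, d_old, t=0.1):
    """Dai-Liao conjugate gradient direction (illustrative; t = 0.1 is an
    arbitrary choice). Here s = x_{k+1} - x_k and y = g_{k+1} - g_k."""
    s = x_new - x_old
    y = g_new - g_old
    beta = np.dot(g_new, y - t * s) / np.dot(d_old, y)
    return -g_new + beta * d_old
```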
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
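The snippet does not specify how the hybridization is built, so the sketch below only shows one common way of hybridizing conjugate gradient parameters (clipping the Polak-Ribiere-Polyak value to the Fletcher-Reeves interval); it should not be read as the algorithm of that paper.

```python
import numpy as np

def beta_hybrid(g_new, g_old):
    # One common hybridization: restrict the PRP value to [0, beta_FR].
    y = g_new - g_old
    beta_prp = np.dot(g_new, y) / np.dot(g_old, g_old)
    beta_fr = np.dot(g_new, g_new) / np.dot(g_old, g_old)
    return max(0.0, min(beta_prp, beta_fr))
```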
A derivative-free nonmonotone line search technique for unconstrained optimization
A tolerant derivative-free nonmonotone line search technique is proposed and analyzed. Several consecutive increases in the objective function and also non-descent directions are admitted for unconstrained minimization. To exemplify the power of this new line search, we describe a direct search algorithm in which the directions are chosen randomly. The convergence properties of this random metho...
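To make the nonmonotone idea concrete, here is a generic derivative-free backtracking test in which a step is accepted if the new objective value improves on the maximum of a few recent values by a forcing term; the specific forcing term and constants are illustrative assumptions, not the tolerant rule analyzed in that paper.

```python
def nonmonotone_backtracking(f, x, d, recent_values, gamma=1e-4,
                             alpha0=1.0, shrink=0.5, max_tries=30):
    """Generic nonmonotone, derivative-free backtracking (illustrative only).

    recent_values: objective values from the last few iterations; comparing
    against their maximum allows occasional increases of the objective."""
    f_ref = max(recent_values)
    alpha = alpha0
    for _ in range(max_tries):
        trial = x + alpha * d
        # Accept if the trial value beats the reference by a forcing term;
        # the alpha**2 term avoids needing derivative information.
        if f(trial) <= f_ref - gamma * alpha ** 2:
            return trial, alpha
        alpha *= shrink
    return x, 0.0  # step rejected; the caller may try another (random) direction
```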
A feasible directions method on combining feasibility with descent for nonlinear constrained optimization
In this paper, a modified gradient projection method is proposed to solve nonlinear constrained optimization problems, where the search direction is obtained by combining feasibility with descent. In addition, it is pointed out that, for linearly constrained optimization problems, this method may be simplified and viewed as a modified version of Rosen's method. The theoretical analy...
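As background for the gradient projection idea mentioned above, the sketch below shows the classical Rosen-type projection of the negative gradient onto the null space of linear equality constraints A x = b; it illustrates only the textbook construction, not the modified method combining feasibility with descent that the paper proposes.

```python
import numpy as np

def projected_gradient_direction(grad, A):
    """Project -grad onto the null space of A (rows of A assumed linearly
    independent), so that a step along the result keeps A x = b satisfied."""
    # Null-space projector: P = I - A^T (A A^T)^{-1} A.
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    return -P @ grad
```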
Journal: Math. Program.
Volume 96, Issue -
Pages -
Publication date: 2003